Dual Free Adaptive Minibatch SDCA for Empirical Risk Minimization
Similar works
Dual Free SDCA for Empirical Risk Minimization with Adaptive Probabilities
In this paper we develop dual free SDCA with adaptive probabilities for regularized empirical risk minimization. This extends recent work of Shai Shalev-Shwartz [SDCA without Duality, arXiv:1502.06177] to allow non-uniform selection of "dual" coordinates in SDCA. Moreover, the probabilities can change over time, making the method more efficient than uniform selection. Our work focuses on generating adapti...
Dual Free Adaptive Mini-batch SDCA for Empirical Risk Minimization
In this paper we develop dual free mini-batch SDCA with adaptive probabilities for regularized empirical risk minimization. This work is motivated by recent work of Shai Shalev-Shwartz on the dual free SDCA method; however, we allow a non-uniform selection of "dual" coordinates in SDCA. Moreover, the probabilities can change over time, making it more efficient than fixed uniform or non-uniform selection...
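The dual-free mini-batch SDCA idea above can be sketched concretely. The sketch below assumes squared loss phi_i(w) = (x_i·w - y_i)^2 / 2 with an L2 regularizer (lam/2)||w||^2, and derives the adaptive probabilities from the per-example "dual residual" norms; the step size eta, batch size b, and with-replacement batch sampling are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def minibatch_sdca(X, y, lam=0.1, eta=0.01, b=4, iters=800, seed=0):
    """Sketch of dual-free mini-batch SDCA with adaptive probabilities."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros((n, d))   # pseudo-dual vectors, one per example
    w = np.zeros(d)            # invariant: w = (1/(lam*n)) * sum_i alpha_i
    for _ in range(iters):
        # Per-example loss gradients; recomputed in full here for clarity
        # (a practical implementation maintains these incrementally).
        grads = (X @ w - y)[:, None] * X
        resid = np.linalg.norm(grads + alpha, axis=1)
        total = resid.sum()
        # Adaptive, time-varying probabilities from the dual residuals.
        p = resid / total if total > 0 else np.full(n, 1.0 / n)
        batch = rng.choice(n, size=b, p=p)   # batch sampled with replacement
        for i in batch:
            step = eta / (b * n * p[i])      # importance weight keeps it unbiased
            delta = grads[i] + alpha[i]
            alpha[i] -= step * lam * n * delta   # dual update ...
            w -= step * delta                    # ... paired primal update
    return w
```

Examples with a large residual norm are both sampled more often and given a smaller importance-weighted step, which is what makes the non-uniform, time-varying selection stable.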
SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization
We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all local curvature information contained in the examples, which leads to striking impr...
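The block-Newton flavor of this idea can be illustrated on the dual of ridge regression, where each iteration exactly maximizes the (quadratic) dual over a random subset of dual variables using the corresponding sub-Hessian. The problem choice and all parameter values are illustrative assumptions, not the paper's SDNA specification.

```python
import numpy as np

def sdna_ridge(X, y, lam=0.1, tau=4, iters=1000, seed=0):
    """Block-Newton ascent on the ridge dual D(a) = (1/n) a.y - a.H.a / 2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Negative dual Hessian H = X X^T / (lam n^2) + I / n (positive definite).
    H = X @ X.T / (lam * n ** 2) + np.eye(n) / n
    alpha = np.zeros(n)
    for _ in range(iters):
        S = rng.choice(n, size=min(tau, n), replace=False)  # random block
        g = y / n - H @ alpha                               # dual gradient
        # Exact maximization over the sampled block: the step uses all
        # curvature information in the sub-Hessian H[S, S].
        alpha[S] += np.linalg.solve(H[np.ix_(S, S)], g[S])
    return X.T @ alpha / (lam * n)   # recover the primal solution from alpha
```

Because the dual here is quadratic, the block step is a full block maximization rather than a damped Newton step, which mirrors the contrast the abstract draws with plain stochastic dual coordinate ascent.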
Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
We consider a generic convex optimization problem associated with regularized empirical risk minimization of linear predictors. The problem structure allows us to reformulate it as a convex-concave saddle point problem. We propose a stochastic primal-dual coordinate method, which alternates between maximizing over one (or more) randomly chosen dual variable and minimizing over the primal variab...
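A simplified sketch of the alternating updates for ridge regression written as the saddle-point problem min_w max_a (1/n) sum_i [a_i <x_i, w> - a_i^2/2 - a_i y_i] + (lam/2)||w||^2, where a_i^2/2 + a_i y_i is the conjugate of the squared loss. The step sizes sigma, tau and the omission of the extrapolation step are simplifications for illustration, not the paper's tuned scheme.

```python
import numpy as np

def spdc_ridge(X, y, lam=0.1, sigma=0.5, tau=0.05, iters=5000, seed=0):
    """Alternate a prox-step on one random dual coordinate with a primal prox-step."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    a = np.zeros(n)    # dual variables, one per example
    w = np.zeros(d)
    u = np.zeros(d)    # running average (1/n) * sum_i a_i * x_i
    for _ in range(iters):
        i = rng.integers(n)
        # Prox-step maximizing the saddle objective over the chosen a_i.
        a_new = (a[i] + sigma * (X[i] @ w - y[i])) / (1.0 + sigma)
        u += (a_new - a[i]) * X[i] / n
        a[i] = a_new
        # Prox-step minimizing over the primal variable w.
        w = (w - tau * u) / (1.0 + tau * lam)
    return w
```

At a fixed point, a_i = x_i·w - y_i and lam*w = -u, which together recover the regularized least-squares optimality condition.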
Adaptive Newton Method for Empirical Risk Minimization to Statistical Accuracy
We consider empirical risk minimization for large-scale datasets. We introduce Ada Newton as an adaptive algorithm that uses Newton’s method with adaptive sample sizes. The main idea of Ada Newton is to increase the size of the training set by a factor larger than one in a way that the minimization variable for the current training set is in the local neighborhood of the optimal argument of the...
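The adaptive-sample-size idea can be illustrated on a quadratic (ridge) objective, where each Newton step is exact: solve on a small subsample, then warm-start the next, larger training set from the current solution. The doubling growth factor and parameter values here are assumptions for illustration, not Ada Newton's statistically motivated growth condition.

```python
import numpy as np

def adaptive_newton(X, y, lam=0.1, m0=8):
    """Newton's method with a geometrically growing training set (ridge loss)."""
    n, d = X.shape
    w = np.zeros(d)
    m = m0
    while True:
        Xi, yi = X[:m], y[:m]
        # One Newton step on the current subsample's regularized risk
        # (exact, since the objective is quadratic).
        H = Xi.T @ Xi / m + lam * np.eye(d)       # Hessian
        g = Xi.T @ (Xi @ w - yi) / m + lam * w    # gradient at the warm start w
        w = w - np.linalg.solve(H, g)
        if m >= n:
            return w
        m = min(2 * m, n)   # enlarge the training set by a factor > 1
```

The point of the warm start is that the minimizer on m samples already lies in the Newton convergence region of the minimizer on 2m samples, so each stage needs only a step or two.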
Journal
Journal title: Frontiers in Applied Mathematics and Statistics
Year: 2018
ISSN: 2297-4687
DOI: 10.3389/fams.2018.00033